Factorized Diffusion Map Approximation
Authors
Abstract
Diffusion maps are among the most powerful machine learning tools for analyzing and working with complex high-dimensional datasets. Unfortunately, the estimation of these maps from a finite sample is known to suffer from the curse of dimensionality. Motivated by other machine learning models in which structure in the underlying data distribution reduces the complexity of estimation, we show how factorizing the underlying distribution into independent subspaces allows diffusion maps to be estimated more accurately. Building on this result, we develop an algorithm that automatically factorizes a high-dimensional data space so as to minimize the estimation error of its diffusion map, even when the underlying distribution is not decomposable. Experiments on both synthetic and real-world datasets demonstrate the improved estimation performance of our method over the standard diffusion-map framework.
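The abstract contrasts the standard diffusion-map estimate with one computed on independent subspaces. The sketch below is not the paper's algorithm; it is a minimal illustration, assuming a plain Gaussian-kernel diffusion map and a user-supplied partition of the coordinates into independent blocks (the paper learns this partition automatically). The function names and the `blocks` parameter are hypothetical.

```python
import numpy as np

def diffusion_map(X, eps=1.0, n_coords=2):
    """Standard diffusion-map embedding (sketch).

    Gaussian kernel -> row-normalized Markov matrix -> eigendecomposition;
    the top non-trivial eigenvectors give the diffusion coordinates.
    """
    # Pairwise squared distances and Gaussian kernel
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-d2 / eps)
    # Row-normalize to obtain a Markov transition matrix
    P = K / K.sum(axis=1, keepdims=True)
    # Eigendecompose; the eigenvalue-1 constant eigenvector is trivial
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Diffusion coordinates: lambda_k * psi_k, skipping the trivial pair
    return vals[1:n_coords + 1] * vecs[:, 1:n_coords + 1]

def factorized_diffusion_map(X, blocks, eps=1.0, n_coords=2):
    """Hypothetical factorized variant: when the distribution factors over
    independent groups of coordinates (`blocks`), estimate a diffusion map
    per block and concatenate the per-block embeddings."""
    return np.hstack([diffusion_map(X[:, b], eps, n_coords) for b in blocks])

# Example: 4-D data whose first two and last two coordinates are independent
X = np.random.RandomState(0).randn(200, 4)
joint = diffusion_map(X)                                   # (200, 2)
factored = factorized_diffusion_map(X, [[0, 1], [2, 3]])   # (200, 4)
```

Each per-block kernel then only has to resolve distances in a low-dimensional subspace, which is the intuition behind the claimed reduction in estimation error.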
Similar Resources
Implementation Schemes for the Factorized Quantum Lattice-Gas Algorithm for the One Dimensional Diffusion Equation using Persistent-Current Qubits
We present two experimental schemes that can be used to implement the Factorized Quantum Lattice-Gas Algorithm for the 1D Diffusion Equation with Persistent-Current Qubits. One scheme involves biasing the PC Qubit at multiple flux bias points throughout the course of the algorithm. An implementation analogous to that done in Nuclear Magnetic Resonance Quantum Computing is also developed. Errors...
Self-Organizing Sparse Codes
Sparse coding as applied to natural image patches learns Gabor-like components that resemble those found in the lower areas of the visual cortex. This biological motivation for sparse coding would also suggest that the learned receptive field elements be organized spatially by their response properties. However, the factorized prior in the original sparse coding model does not enforce this. We ...
Factorized Asymptotic Bayesian Inference for Mixture Modeling
This paper proposes a novel Bayesian approximation inference method for mixture modeling. Our key idea is to factorize marginal log-likelihood using a variational distribution over latent variables. An asymptotic approximation, a factorized information criterion (FIC), is obtained by applying the Laplace method to each of the factorized components. In order to evaluate FIC, we propose factorize...
Improving posterior marginal approximations in latent Gaussian models
We consider the problem of correcting the posterior marginal approximations computed by expectation propagation and Laplace approximation in latent Gaussian models and propose correction methods that are similar in spirit to the Laplace approximation of Tierney and Kadane (1986). We show that in the case of sparse Gaussian models, the computational complexity of expectation propagation can be m...
Improving the Mean Field Approximation via the Use of Mixture Distributions
Mean field methods provide computationally efficient approximations to posterior probability distributions for graphical models. Simple mean field methods make a completely factorized approximation to the posterior, which is unlikely to be accurate when the posterior is multi-modal. Indeed, if the posterior is multi-modal, only one of the modes can be captured. To improve the mean field approximation i...
Journal: JMLR Workshop and Conference Proceedings
Volume 2012, Issue -
Pages -
Publication date: 2012